Module 4: Information Theory and Coding


On completion of this module, you will be able to:

  • Define information and discrete messages
  • Define amount of information
  • Explain entropy and its properties
  • Define information rate
  • Explain mutual information
  • Explain properties of mutual information

  • Introduction to Information

    Uncertainty & Probability

    Uncertainty ∝ 1 / P(event)

    Uncertainty = log (1 / P(event)) = − log P(event)
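    The inverse relationship between uncertainty and probability can be checked numerically (a minimal sketch; the probabilities are arbitrary example values):

```python
import math

def self_information(p, base=2):
    """Information (uncertainty) of an event with probability p: -log(p)."""
    return -math.log(p, base)

# Rarer events carry more information: uncertainty grows as P(event) falls.
for p in (0.5, 0.25, 0.125):
    print(f"P = {p}: I = {self_information(p)} bits")
```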

    Introduction to Discrete Messages


    Fig. 2: Discrete Sampled Signals.

    Amount of Information

    Unit of Information

    Problem:

    A message signal mk is transmitted by a transmitter. The probability of occurrence of this signal is 1/2. Calculate the information conveyed by it in bits, decits, and nats.

    Solution :
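    The computation can be checked numerically (a minimal sketch): with P(mk) = 1/2, only the base of the logarithm changes between units.

```python
import math

p = 0.5  # probability of occurrence of mk

I_bits  = -math.log2(p)    # base-2 logarithm  -> bits
I_nats  = -math.log(p)     # natural logarithm -> nats
I_decit = -math.log10(p)   # base-10 logarithm -> decits (Hartleys)

print(f"I = {I_bits} bit = {I_nats:.3f} nat = {I_decit:.3f} decit")
```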

    Introduction to Entropy

    Entropy for different probability

    Property of Entropy
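    One key property, that the entropy of a binary source is maximized when both symbols are equally likely, can be illustrated with a short sketch (the probability values are assumed examples):

```python
import math

def entropy(probs):
    """H(X) = -sum p*log2(p), in bits/symbol (terms with p = 0 contribute 0)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Binary source: H peaks at 1 bit when p = 0.5 and falls toward 0 as p -> 0 or 1.
for p in (0.1, 0.3, 0.5, 0.7, 0.9):
    print(f"p = {p}: H = {entropy([p, 1 - p]):.4f} bits")
```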

    Joint Entropy

    Conditional Entropy

    Relationship between the conditional & joint Entropy

    Rate of Information

    Problem :

    Solution :

    Mutual Information

    For an ideal Noiseless Channel

    If the channel noise is large enough that the output 'y' is totally unrelated to the input 'x', then

    Average mutual information

    Expression for mutual information
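    The expression I(X;Y) = H(X) − H(X|Y) can be evaluated numerically from a joint distribution (a sketch using an assumed example: equiprobable binary input over a BSC with error probability 0.1):

```python
import math

def H(probs):
    """Entropy in bits of a probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Assumed example joint distribution P(x, y).
joint = {(0, 0): 0.45, (0, 1): 0.05, (1, 0): 0.05, (1, 1): 0.45}

px = {x: sum(p for (xi, _), p in joint.items() if xi == x) for x in (0, 1)}
py = {y: sum(p for (_, yi), p in joint.items() if yi == y) for y in (0, 1)}

H_X  = H(px.values())
H_XY = H(joint.values())             # joint entropy H(X, Y)
H_X_given_Y = H_XY - H(py.values())  # H(X|Y) = H(X, Y) - H(Y)

I = H_X - H_X_given_Y
print(f"I(X;Y) = {I:.4f} bits")      # 1 - H(0.1) ≈ 0.531 bits
```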

    Source Coding

    Channel Coding

    Discrete Memoryless Channel

    Code efficiency

    Forward Transition Matrix

    Binary symmetric channel (BSC)

    Channel Capacity
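    For the BSC in particular, the capacity has the closed form C = 1 − H(p), where H is the binary entropy function and p the crossover probability; a minimal sketch (the p values are assumed examples):

```python
import math

def binary_entropy(p):
    """H(p) = -p*log2(p) - (1-p)*log2(1-p), with H(0) = H(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover probability p."""
    return 1.0 - binary_entropy(p)

# Capacity is 1 bit for a noiseless BSC and 0 when p = 0.5 (useless channel).
for p in (0.0, 0.1, 0.5):
    print(f"p = {p}: C = {bsc_capacity(p):.4f} bits/channel use")
```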

    Shannon-Fano Algorithm

    The following procedure is used to obtain the code word for each message:
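    The Shannon-Fano procedure (sort symbols by probability, split the list into two parts of nearly equal total probability, assign 0 to one part and 1 to the other, and recurse) can be sketched as follows; the symbol probabilities are assumed example values:

```python
def shannon_fano(symbols):
    """symbols: list of (symbol, probability). Returns {symbol: code string}."""
    symbols = sorted(symbols, key=lambda sp: sp[1], reverse=True)
    codes = {s: "" for s, _ in symbols}

    def split(group):
        if len(group) <= 1:
            return
        total = sum(p for _, p in group)
        # Find the split point that makes the two halves' probabilities closest.
        run, best = 0.0, None
        for i in range(1, len(group)):
            run += group[i - 1][1]
            diff = abs(2 * run - total)
            if best is None or diff < best[0]:
                best = (diff, i)
        k = best[1]
        for s, _ in group[:k]:
            codes[s] += "0"   # upper half gets a 0
        for s, _ in group[k:]:
            codes[s] += "1"   # lower half gets a 1
        split(group[:k])
        split(group[k:])

    split(symbols)
    return codes

codes = shannon_fano([("a", 0.4), ("b", 0.2), ("c", 0.2), ("d", 0.1), ("e", 0.1)])
print(codes)  # {'a': '0', 'b': '10', 'c': '110', 'd': '1110', 'e': '1111'}
```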

    Problem :

    Prefix Coding

    Huffman Coding

    Example : A discrete memoryless source has five symbols with the probabilities given below. Find the Huffman code.


    Solution:

    Message   Probability   Code   Code Length
    a         0.25          00     2
    b         0.25          10     2
    c         0.2           11     2
    d         0.15          010    3
    e         0.15          011    3
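    The construction can be verified programmatically (a minimal heap-based Huffman sketch; tie-breaking may produce a different but equally optimal code, so code lengths and average length are what should match the table):

```python
import heapq

def huffman_lengths(probs):
    """Return {symbol: code length} for a Huffman code over {symbol: prob}."""
    # Heap entries: (probability, tie_breaker, [symbols in this subtree]).
    heap = [(p, i, [s]) for i, (s, p) in enumerate(sorted(probs.items()))]
    heapq.heapify(heap)
    lengths = {s: 0 for s in probs}
    tie = len(heap)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)   # merge the two least-probable nodes
        p2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:                 # every merge adds one bit to these symbols
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, tie, s1 + s2))
        tie += 1
    return lengths

probs = {"a": 0.25, "b": 0.25, "c": 0.2, "d": 0.15, "e": 0.15}
lengths = huffman_lengths(probs)
avg = sum(probs[s] * lengths[s] for s in probs)
print(lengths)                                   # a, b, c -> 2 bits; d, e -> 3 bits
print(f"average length = {avg:.2f} bits/symbol")  # 2.30
```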

    Shannon's theorem on channel capacity

    Shannon-Hartley theorem

    Trade-off between bandwidth and signal-to-noise ratio (SNR)
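    The theorem C = B log2(1 + S/N) and the bandwidth-SNR trade-off can be explored numerically (a sketch; the channel parameters are assumed example values):

```python
import math

def capacity(bandwidth_hz, snr_db):
    """Shannon-Hartley: C = B * log2(1 + S/N), with SNR given in dB."""
    snr = 10 ** (snr_db / 10)
    return bandwidth_hz * math.log2(1 + snr)

# Assumed example: a telephone-grade channel, B = 3 kHz, SNR = 30 dB.
print(f"C = {capacity(3000, 30):.0f} bit/s")

# Trade-off: halving B while roughly squaring the SNR keeps C comparable.
print(f"C = {capacity(1500, 60):.0f} bit/s")
```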

    Noiseless channel has infinite capacity


    Infinite bandwidth has limited capacity

    Example :

    Solution :


    Example :

    Solution :


    Error Detection & Error Correction Codes

    Need for error control coding

    Linear Block Code

    Encoding of (n, k) linear block code

    Systematic Block Code (n, k)
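    A systematic (n, k) code sends the k message bits unchanged followed by n − k parity bits, computed as c = m·G with G = [I_k | P]; a minimal sketch using an assumed parity submatrix P for a (7, 4) Hamming-style code:

```python
k, n = 4, 7

# Assumed parity submatrix P (one row per message bit); G = [I_k | P].
P = [[1, 1, 0],
     [0, 1, 1],
     [1, 1, 1],
     [1, 0, 1]]

def encode(m):
    """Systematic encoding c = m*G mod 2: message bits followed by parity bits."""
    parity = [sum(m[i] * P[i][j] for i in range(k)) % 2 for j in range(n - k)]
    return list(m) + parity

c = encode([1, 0, 1, 1])
print(c)  # first 4 bits are the message itself, last 3 are parity
```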

    Decoding of (n, k) Linear Block Code

    Linear block codes applications

    Hamming code


    Hamming distance

    Hamming weight
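    Hamming distance (the number of positions in which two codewords differ) and Hamming weight (the number of nonzero bits in a codeword) can be sketched directly; the codewords below are arbitrary examples:

```python
def hamming_weight(word):
    """Number of nonzero symbols in the codeword."""
    return sum(1 for bit in word if bit != 0)

def hamming_distance(a, b):
    """Number of positions in which two equal-length codewords differ."""
    assert len(a) == len(b)
    return sum(1 for x, y in zip(a, b) if x != y)

c1 = [1, 0, 1, 1, 0, 1, 0]
c2 = [1, 1, 1, 0, 0, 1, 1]
print(hamming_weight(c1))        # 4
print(hamming_distance(c1, c2))  # 3
```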

    Convolution Code

    Code Rate

    Constraint Length

    Code Dimension

    Advantages of convolution codes

    Disadvantages of convolution codes

    Applications of convolution codes